Neighborhood classifiers
Authors
Abstract
The K-nearest neighbor classifier (K-NN) is widely discussed and applied in pattern recognition and machine learning; however, little literature has been reported on the neighborhood classifier, a similar lazy classifier that uses local information to recognize a new test sample. In this paper, we introduce the neighborhood rough set model as a uniform framework to understand and implement neighborhood classifiers. This algorithm integrates the attribute reduction technique with classification learning. We study the influence of the three norms on attribute reduction and classification, and compare the neighborhood classifier with K-NN, CART and SVM. The experimental results show that the neighborhood-based feature selection algorithm is able to delete most of the redundant and irrelevant features. The classification accuracy of the neighborhood classifier is superior to that of K-NN and CART in both the original feature spaces and the reduced feature subspaces, and slightly weaker than that of SVM. © 2006 Elsevier Ltd. All rights reserved.
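The classification rule summarized in the abstract can be sketched in a few lines: a test sample is assigned the majority class of all training samples falling inside its δ-neighborhood under a chosen norm. The Python sketch below is illustrative only and is not the authors' implementation; the radius `delta=0.15`, the assumption that features are scaled to [0, 1], and the nearest-neighbor fallback for an empty neighborhood are choices made here for the example.

```python
import numpy as np

def neighborhood_classify(X_train, y_train, X_test, delta=0.15, p=2):
    """Minimal neighborhood-classifier sketch (not the paper's exact algorithm).

    For each test sample, the training samples within distance `delta`
    (Minkowski p-norm; p=1, 2, or np.inf correspond to the three norms
    discussed in the paper) form its neighborhood, and the majority class
    of that neighborhood is predicted. Falling back to the single nearest
    neighbor when the neighborhood is empty is an assumption made here.
    """
    predictions = []
    for x in X_test:
        # distances from the test sample under the chosen norm
        if np.isinf(p):
            dists = np.max(np.abs(X_train - x), axis=1)
        else:
            dists = np.sum(np.abs(X_train - x) ** p, axis=1) ** (1.0 / p)
        # neighborhood: training samples within radius delta
        mask = dists <= delta
        if not mask.any():
            mask = dists == dists.min()  # assumed fallback: nearest neighbor
        labels, counts = np.unique(y_train[mask], return_counts=True)
        predictions.append(labels[np.argmax(counts)])
    return np.array(predictions)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((100, 4))                     # features assumed scaled to [0, 1]
    y = (X[:, 0] + X[:, 1] > 1).astype(int)      # toy labels for illustration
    print(neighborhood_classify(X[:80], y[:80], X[80:], delta=0.15, p=2))
```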
Similar resources
An Evolutionary Approach To Cascade Multiple Classifiers: A Case-Study To Analyze Textual Content Of Medical Records And Identify Potential Diagnosis
This paper describes an experiment where classifiers are used to identify potential diagnoses by examining the textual content of medical records. Three classifiers are applied separately (k-nearest neighbor, multilayer perceptron and support vector machines) and also combined in two different approaches (parallel and cascading); results show that even when accuracy points to a best alternative, ROC ...
Design of Nearest Neighborhood Ensemble for Enhanced Accuracies to Diabetes Data Set
The design of an ensemble guarantees success only when its base classifiers make limited errors and achieve high accuracy. We address the problem of achieving a possible enhancement of the accuracy of the learning models for the diabetes data set. In this paper we design an ensemble with four types of meta-level classifiers for this purpose. The base classifiers for feeding the ensemble we have pro...
Diversity in Dynamic Class Prediction
Instead of using the same ensemble for all data instances, recent studies have focused on dynamic ensembles, in which a new ensemble is chosen from a pool of classifiers for each new data instance. Classifier agreement in the region where a new data instance resides has been considered a major factor in dynamic ensembles. We postulate that the classifiers chosen for a dynamic ensemble sho...
Feature Extraction for Dynamic Integration of Classifiers
Recent research has shown the integration of multiple classifiers to be one of the most important directions in machine learning and data mining. In this paper, we present an algorithm for the dynamic integration of classifiers in the space of extracted features (FEDIC). It is based on the technique of dynamic integration, in which local accuracy estimates are calculated for each base classifie...
Adaptive neighborhood granularity selection and combination based on margin distribution optimization
Granular computing aims to develop a granular view for interpreting and solving problems. The model of neighborhood rough sets is one of the effective tools for granular computing. This model can deal with complex tasks of classification learning. Despite the success of the neighborhood model in attribute reduction and rule learning, it still suffers from the issue of granularity selection. Namely,...
Journal: Expert Syst. Appl.
Volume: 34
Issue: -
Pages: -
Publication year: 2008